
[Torch] Model Parallel Customization #2839

Merged: 2 commits merged into facebookresearch:master from mp_changes on Jul 23, 2020

Conversation

klshuster (Contributor)

Patch description
Two very small changes to the model parallel PipelineHelper:

  1. Provide a way to exempt module lists from model parallel (a sketch follows this list)
  2. Change the devices considered (so as not to violate a later assertion)
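For concreteness, here is a minimal sketch of what the exemption in (1) might look like. PipelineHelperSketch, _split_into_chunks, and the model_parallel_exempt flag are illustrative assumptions, not the exact ParlAI API:

import torch.nn as nn


class PipelineHelperSketch:
    def make_parallel(self, model: nn.Module) -> nn.Module:
        # Shard every ModuleList across devices, unless it opted out.
        for module in model.modules():
            if not isinstance(module, nn.ModuleList):
                continue
            if getattr(module, 'model_parallel_exempt', False):
                # Hypothetical opt-out flag: leave this ModuleList alone.
                continue
            self._split_into_chunks(module)
        return model

    def _split_into_chunks(self, module_list: nn.ModuleList) -> None:
        ...  # device-assignment logic lives here in the real helper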

Note that my change is 2 lines; the additional 3 are from black autoformatting.

Testing steps
Added a test to tests/test_utils_torch.py to check model parallel exemption. I am honestly not 100% sure how to test my second change.
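For reference, a hedged sketch of how such an exemption test could be structured; fake_make_parallel and the model_parallel_exempt flag are stand-ins for the real ParlAI internals, not the actual test in this PR:

import unittest

import torch.nn as nn


def fake_make_parallel(model: nn.Module) -> set:
    """Toy stand-in: report which ModuleLists would be sharded."""
    sharded = set()
    for name, module in model.named_modules():
        if isinstance(module, nn.ModuleList) and not getattr(
            module, 'model_parallel_exempt', False
        ):
            sharded.add(name)
    return sharded


class TestModelParallelExempt(unittest.TestCase):
    def test_exempt_module_list_is_skipped(self):
        model = nn.Module()
        model.encoder = nn.ModuleList([nn.Linear(4, 4)])
        model.decoder = nn.ModuleList([nn.Linear(4, 4)])
        model.decoder.model_parallel_exempt = True  # hypothetical flag
        self.assertEqual(fake_make_parallel(model), {'encoder'})


if __name__ == '__main__':
    unittest.main()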

Logs

 python -m pytest tests/test_utils_torch.py
============================================================================== test session starts ===============================================================================
platform linux -- Python 3.6.9, pytest-5.3.2, py-1.8.1, pluggy-0.13.1
rootdir: /private/home/kshuster/ParlAI, inifile: pytest.ini
plugins: requests-mock-1.7.0
collected 18 items

tests/test_utils_torch.py ..................                                                                                                                               [100%]

=========================================================================== slowest 10 test durations ============================================================================
0.01s call     tests/test_utils_torch.py::TestPipelineHelper::test_model_parallel_exempt

(0.00 durations hidden.  Use -vv to show these durations.)
========================================================================= 18 passed, 1 warning in 1.50s ==========================================================================

@stephenroller (Contributor) left a comment


Seems reasonable. Another option would be that we define:

from torch.nn import ModuleList

class ModelParallelizeableModuleList(ModuleList):
    pass

and have make_parallel only hit modules of this instance.
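A hedged sketch of that opt-in variant; only the isinstance check is the point, and make_parallel_sketch / shard_across_devices are illustrative stand-ins, not ParlAI code:

import torch.nn as nn


class ModelParallelizeableModuleList(nn.ModuleList):
    """Marker subclass: only these ModuleLists get sharded."""


def shard_across_devices(module_list: nn.ModuleList) -> None:
    ...  # real device-assignment logic would go here


def make_parallel_sketch(model: nn.Module) -> nn.Module:
    for module in model.modules():
        # Opt-in: plain ModuleLists are ignored entirely.
        if isinstance(module, ModelParallelizeableModuleList):
            shard_across_devices(module)
    return model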

@klshuster (Contributor, Author)

Yes, I suppose that would require MP to be opt-in rather than opt-out. Happy to explore that in future PRs; I think this is a reasonable solution for now. Each solution requires some intimate knowledge of ParlAI anyway.

@klshuster klshuster merged commit 1b0ef77 into facebookresearch:master Jul 23, 2020
@klshuster klshuster deleted the mp_changes branch July 23, 2020 21:12
@stephenroller (Contributor)

Model parallel is explicitly opt-in anyway. The forward method has to be amenable to it.
